# Character-level tokenization
## CharLLaMa-35M

**License:** openrail · **Author:** inkoziev · **Tags:** Large Language Model, Transformers, Other

CharLLaMa-35M is a miniature language model based on the LLaMA architecture. It uses character-level tokenization, making it suitable for experimental scenarios where BPE (subword) tokenization underperforms.
## CharGPT-96M

**License:** openrail · **Author:** inkoziev · **Tags:** Large Language Model, Transformers, Other

CharGPT-96M is a small language model that uses character-level tokenization. It is suitable for experimental scenarios, especially those where BPE (subword) tokenization leads to poor task performance.
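Both models replace subword (BPE) tokenization with character-level tokenization, meaning every character in the input maps to its own token id. The sketch below illustrates the idea in plain Python; the class name and toy vocabulary are illustrative only and do not reproduce these models' actual tokenizers, which may add special tokens and order their vocabularies differently.

```python
class CharTokenizer:
    """Minimal character-level tokenizer: one token id per character."""

    def __init__(self, corpus: str):
        # Vocabulary is simply the sorted set of characters seen in the corpus.
        self.vocab = sorted(set(corpus))
        self.char_to_id = {ch: i for i, ch in enumerate(self.vocab)}
        self.id_to_char = {i: ch for i, ch in enumerate(self.vocab)}

    def encode(self, text: str) -> list[int]:
        # Each character becomes exactly one id; no merges, no subwords.
        return [self.char_to_id[ch] for ch in text]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.id_to_char[i] for i in ids)


tok = CharTokenizer("hello world")
ids = tok.encode("hello")
print(ids)              # five ids, one per character
print(tok.decode(ids))  # round-trips back to "hello"
```

Because the vocabulary is tiny (one entry per distinct character), such models avoid the out-of-vocabulary and awkward-segmentation issues that BPE can introduce, at the cost of longer token sequences for the same text.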